Robust Divergence Measures for Time Series Discrimination
Abstract
New divergence measures are introduced for change detection and discrimination of stochastic signals (time series) on the basis of parametric filtering, a technique that combines parametric linear filtering with correlation characterization. The sensitivity of these divergence measures is investigated using local curvatures under additive and multiplicative spectral departure models. It is found that when the time series contain dominant spectral components (such as sharp peaks and notches) of similar characteristics, the new divergence measures are more effective in detecting the spectral deviations masked by those dominant components than conventional spectral divergence measures such as the Kolmogorov-Smirnov spectral distance and the Kullback-Leibler spectral divergence. Simulation results are given that illustrate and confirm the theoretical findings.
Key words and phrases: Characterization; Discriminant Analysis; Distortion Measure; Filter Bank; Pattern Recognition; Signal Processing; Spectral Analysis
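The abstract does not give closed forms for the two conventional baselines it names, and the parametric-filtering measures themselves are not specified here, so the following is a minimal sketch of the baselines only, assuming their usual definitions: the Kolmogorov-Smirnov spectral distance as the largest gap between normalized spectral distribution functions, and the Kullback-Leibler spectral divergence in its Itakura-Saito form, f1/f2 - log(f1/f2) - 1, averaged over frequency. The AR(2) settings and all function names are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.signal import freqz

def ar2_spectrum(r, lam, n_freq=1024):
    """Spectral density (up to a constant) of a stable AR(2) process whose
    characteristic roots are the complex pair r * exp(+/- i*lam);
    the spectrum has a sharp peak near frequency lam when r is close to 1."""
    phi1, phi2 = 2.0 * r * np.cos(lam), -r ** 2
    a = [1.0, -phi1, -phi2]            # AR polynomial 1 - phi1 z^-1 - phi2 z^-2
    w, h = freqz([1.0], a, worN=n_freq)
    return w, np.abs(h) ** 2

def ks_spectral_distance(f1, f2):
    """Kolmogorov-Smirnov distance between the normalized spectral
    distribution functions of two spectra sampled on a common grid."""
    F1, F2 = np.cumsum(f1) / f1.sum(), np.cumsum(f2) / f2.sum()
    return np.max(np.abs(F1 - F2))

def kl_spectral_divergence(f1, f2):
    """Kullback-Leibler spectral divergence (Itakura-Saito form),
    averaged over the frequency grid."""
    rho = f1 / f2
    return np.mean(rho - np.log(rho) - 1.0)

# Two series sharing a sharp spectral peak of similar characteristics (the
# scenario discussed in the abstract), differing slightly in peak location.
w, f1 = ar2_spectrum(0.95, 0.60)
_, f2 = ar2_spectrum(0.95, 0.65)
print(ks_spectral_distance(f1, f2), kl_spectral_divergence(f1, f2))
```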
Similar Resources
Seven Means, Generalized Triangular Discrimination, and Generating Divergence Measures
Jensen-Shannon, J-divergence, and arithmetic-geometric mean divergences are three classical divergence measures known in the information theory and statistics literature. These three divergence measures bear interesting inequalities with the three non-logarithmic measures known as triangular discrimination, Hellinger's divergence, and symmetric chi-square divergence. However, in 2003, Eve studied ...
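For reference, the three non-logarithmic measures named in this snippet have standard textbook forms; the sketch below uses common conventions (Hellinger discrimination with a 1/2 factor), which may differ by constant factors from the cited paper.

```python
import numpy as np

def triangular_discrimination(p, q):
    """Delta(P,Q) = sum (p - q)^2 / (p + q)."""
    return np.sum((p - q) ** 2 / (p + q))

def hellinger_discrimination(p, q):
    """h(P,Q) = (1/2) sum (sqrt(p) - sqrt(q))^2 = 1 - sum sqrt(p*q)."""
    return 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

def symmetric_chi_square(p, q):
    """Psi(P,Q) = sum (p - q)^2 * (p + q) / (p * q)."""
    return np.sum((p - q) ** 2 * (p + q) / (p * q))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.3, 0.3, 0.4])
print(triangular_discrimination(p, q),
      hellinger_discrimination(p, q),
      symmetric_chi_square(p, q))
```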
Some Inequalities Among New Divergence Measures
There are three classical divergence measures in the literature on information theory and statistics: the Jeffreys-Kullback-Leibler J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [8] arithmetic-geometric mean divergence. These three measures bear an interesting relationship among each other and are based on logarithmic expressions. The divergence m...
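The three logarithmic measures listed here are standard; a minimal sketch with their usual definitions (J-divergence as symmetrized Kullback-Leibler, Jensen-Shannon via the midpoint distribution, and the arithmetic-geometric mean divergence comparing the arithmetic and geometric means of the two distributions):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence sum p * log(p/q)."""
    return np.sum(p * np.log(p / q))

def j_divergence(p, q):
    """Jeffreys' J-divergence: symmetrized KL divergence."""
    return kl(p, q) + kl(q, p)

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: average KL to the midpoint distribution."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def arithmetic_geometric(p, q):
    """Arithmetic-geometric mean divergence: KL of the arithmetic mean
    from the geometric mean, coordinate-wise."""
    m = 0.5 * (p + q)
    return np.sum(m * np.log(m / np.sqrt(p * q)))

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.3, 0.3, 0.4])
print(j_divergence(p, q), jensen_shannon(p, q), arithmetic_geometric(p, q))
```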
Discrimination of time series based on kernel method
Classical discrimination methods such as linear and quadratic discriminant analysis are not efficient for non-Gaussian or nonlinear time series data. Nonparametric kernel discrimination, in which kernel estimators of the likelihood functions are used in place of their true values, has been shown to perform well. The misclassification rate of kernel discrimination is usually less than ...
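As a sketch of the general idea, assuming a simple scalar feature (lag-1 autocorrelation) that the cited work does not necessarily use: each class's likelihood is replaced by a kernel density estimate over the feature, and a series is assigned to the class with the higher estimated density.

```python
import numpy as np
from scipy.stats import gaussian_kde

def lag1_autocorr(x):
    """Lag-1 autocorrelation, used here as an illustrative scalar feature."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def kernel_discriminant(train_0, train_1, test_series):
    """Classify each test series by whichever class's kernel density
    estimate assigns the higher likelihood to its feature."""
    f0 = gaussian_kde([lag1_autocorr(x) for x in train_0])
    f1 = gaussian_kde([lag1_autocorr(x) for x in train_1])
    feats = np.array([lag1_autocorr(x) for x in test_series])
    return (f1(feats) > f0(feats)).astype(int)

rng = np.random.default_rng(0)
def ar1(phi, n=200):
    """Simulate one AR(1) path with parameter phi."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

train_0 = [ar1(0.2) for _ in range(50)]   # class 0: weak serial dependence
train_1 = [ar1(0.7) for _ in range(50)]   # class 1: strong serial dependence
print(kernel_discriminant(train_0, train_1, [ar1(0.2), ar1(0.7)]))  # expect [0, 1]
```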
Robust Estimation in Linear Regression Model: the Density Power Divergence Approach
The minimum density power divergence method provides a robust estimate when the dataset contains outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, through some numerical examples of the linear regression model, we show the robustness of this est...
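A minimal sketch of minimum density power divergence estimation for normal-error linear regression, following the standard Basu et al. (1998) objective; the tuning constant alpha, the optimizer, and the contaminated example are illustrative and need not match the cited study's implementation.

```python
import numpy as np
from scipy.optimize import minimize

def mdpde_linreg(X, y, alpha=0.5):
    """Minimum density power divergence estimate for y = X @ beta + N(0, sigma^2)
    errors. Minimizes the theta-dependent part of the DPD objective:
      (2*pi*sigma^2)^(-alpha/2) * [ (1+alpha)^(-1/2)
        - (1 + 1/alpha) * mean(exp(-alpha * r_i^2 / (2*sigma^2))) ].
    alpha -> 0 recovers maximum likelihood; larger alpha buys robustness."""
    n, p = X.shape

    def objective(theta):
        beta, sigma = theta[:p], np.exp(theta[p])   # log-parameterize sigma > 0
        r = y - X @ beta
        c = (2.0 * np.pi * sigma ** 2) ** (-alpha / 2.0)
        return c * ((1.0 + alpha) ** -0.5
                    - (1.0 + 1.0 / alpha)
                    * np.mean(np.exp(-alpha * r ** 2 / (2.0 * sigma ** 2))))

    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)   # start from least squares
    theta0 = np.concatenate([beta0, [np.log(np.std(y - X @ beta0) + 1e-8)]])
    res = minimize(objective, theta0, method="Nelder-Mead")
    return res.x[:p], np.exp(res.x[p])

# Contaminated example: a few gross outliers barely move the MDPDE fit.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(100), rng.uniform(0, 10, 100)])
y = X @ np.array([1.0, 2.0]) + rng.normal(0, 1, 100)
y[:5] += 40.0                                       # gross outliers
print(mdpde_linreg(X, y))
```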
Generalized Symmetric Divergence Measures and Metric Spaces
Recently, Taneja [7] studied two one-parameter generalizations of the J-divergence, Jensen-Shannon divergence, and arithmetic-geometric divergence. These two generalizations in particular contain measures such as the Hellinger discrimination, symmetric chi-square divergence, and triangular discrimination. These measures are well known in the literature of statistics and information theory. In thi...
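One common form of the type-s generalization of the J-divergence referred to here, stated as an assumption from the standard literature rather than taken from the cited paper [7], is

$$
\zeta_s(P\|Q) \;=\; \frac{1}{s(s-1)}\Big[\sum_i \big(p_i^{\,s}\,q_i^{\,1-s} + p_i^{\,1-s}\,q_i^{\,s}\big) - 2\Big], \qquad s \neq 0,\,1,
$$

with $\lim_{s\to 1}\zeta_s(P\|Q) = J(P\|Q)$, $\zeta_{1/2}(P\|Q) = 8\,h(P\|Q)$ (Hellinger discrimination, $h = 1 - \sum_i \sqrt{p_i q_i}$), and $\zeta_2(P\|Q) = \tfrac{1}{2}\Psi(P\|Q)$ (symmetric chi-square divergence), which is how the special cases named in the snippet arise from a single parameter.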